
    Getting Beneath the Veil of Effective Schools: Evidence from New York City

    Charter schools were developed, in part, to serve as an R&D engine for traditional public schools, resulting in a wide variety of school strategies and outcomes. In this paper, we collect unparalleled data on the inner workings of 35 charter schools and correlate these data with credible estimates of each school's effectiveness. We find that traditionally collected input measures -- class size, per-pupil expenditure, the fraction of teachers with no certification, and the fraction of teachers with an advanced degree -- are not correlated with school effectiveness. In stark contrast, we show that an index of five policies suggested by over forty years of qualitative research -- frequent teacher feedback, the use of data to guide instruction, high-dosage tutoring, increased instructional time, and high expectations -- explains approximately 50 percent of the variation in school effectiveness. Our results are robust to controls for three alternative theories of schooling: a model emphasizing the provision of wrap-around services, a model focused on teacher selection and retention, and the "No Excuses" model of education. We conclude by showing that our index provides similar results in a separate sample of charter schools.
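The headline figure -- a five-policy index explaining roughly half the variation in effectiveness -- is an R-squared from a school-level regression. A minimal sketch of that computation, using entirely hypothetical synthetic data in place of the paper's survey responses and value-added estimates:

```python
import numpy as np

def r_squared(x, y):
    """Fraction of variance in y explained by a univariate OLS fit on x."""
    slope, intercept = np.polyfit(x, y, 1)   # coefficients: highest degree first
    residuals = y - (slope * x + intercept)
    return 1.0 - residuals.var() / y.var()

# Hypothetical illustration: a school-level policy index (0-5, one point per
# policy: teacher feedback, data-driven instruction, tutoring, instructional
# time, expectations) and value-added effectiveness estimates for 35 schools.
rng = np.random.default_rng(0)
index = rng.integers(0, 6, size=35).astype(float)
effectiveness = 0.05 * index + rng.normal(0.0, 0.1, size=35)

r2 = r_squared(index, effectiveness)
print(f"R^2 = {r2:.2f}")
```

Because the data above are synthetic, the printed R-squared is illustrative only; the paper's 50-percent figure comes from its own sample.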

    Comparing global optimization and default settings of stream-based joins

    One problem encountered in real-time data integration is the join of a continuous incoming data stream with a disk-based relation. In this paper we investigate a stream-based join algorithm, called mesh join (MESHJOIN), and focus on a critical component of the algorithm, the disk-buffer. In MESHJOIN the size of the disk-buffer varies with the total memory budget, and tuning is required to achieve the maximum service rate within the available memory. Until now, little was known about how the optimal disk-buffer size depends on the memory size, and no performance comparison had been carried out between the optimum and reasonable default sizes for the disk-buffer. To avoid tuning, we propose a reasonable default value for the disk-buffer size with a small and acceptable performance loss. The experimental results validate our arguments.
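As a rough illustration of the algorithm being tuned, here is a toy MESHJOIN-style loop (a heavily simplified sketch, not the paper's implementation): stream tuples wait in memory while the disk-based relation is scanned cyclically, `disk_buffer_pages` pages at a time, and a tuple expires once it has seen every page of the relation.

```python
from collections import deque

def mesh_join(stream_keys, relation_pages, disk_buffer_pages):
    """Toy MESHJOIN sketch: join stream keys against a paged relation."""
    n_pages = len(relation_pages)
    waiting = deque()              # entries: [stream_key, pages_still_to_see]
    pending = deque(stream_keys)   # stands in for the continuous stream
    results, page_pos = [], 0
    while pending or waiting:
        if pending:                            # admit one stream tuple
            waiting.append([pending.popleft(), n_pages])
        for _ in range(disk_buffer_pages):     # load one disk-buffer of pages
            page = relation_pages[page_pos]
            page_pos = (page_pos + 1) % n_pages
            for entry in waiting:
                if entry[1] > 0 and entry[0] in page:
                    results.append((entry[0], page[entry[0]]))
                entry[1] -= 1
        while waiting and waiting[0][1] <= 0:  # expire fully-scanned tuples
            waiting.popleft()
    return results

pages = [{"a": 1, "b": 2}, {"c": 3}, {"d": 4}]
print(mesh_join(["c", "a", "x"], pages, disk_buffer_pages=2))
# prints [('c', 3), ('a', 1)]
```

Enlarging `disk_buffer_pages` amortises disk I/O over more stream tuples per iteration but consumes more of the fixed memory budget -- the trade-off the tuning in the paper addresses.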

    An event-based near real-time data integration architecture

    Extract-Transform-Load (ETL) tools feed data from operational databases into data warehouses. Traditionally, these ETL tools use batch processing and operate offline at regular time intervals, for example on a nightly or weekly basis. Naturally, users prefer to have up-to-date data to make their decisions, therefore there is a demand for real-time ETL tools. In this paper we investigate an event-based near real-time ETL layer for transferring and transforming data from the operational database to the data warehouse. One of our main concerns in this paper is master data management in the ETL layer. We present the architecture of a novel, general purpose, event-driven, and near real-time ETL layer that uses a Database Queue (DBQ), works on a push technology principle and directly supports content enrichment. We also observe that the system architecture is consistent with the information architecture of a classical Online Transaction Processing (OLTP) application, allowing us to distinguish between different kinds of data to increase the clarity of the design.
    Keywords: event-based architecture, content enrichment, master data, extract-transform-load, enterprise service bus
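A push-based event flow of the kind described -- operational change events placed on a queue, enriched with master data, then loaded into the warehouse -- can be sketched as follows. The queue, event fields, and master-data table are hypothetical stand-ins, with Python's in-process `queue.Queue` in place of a real Database Queue:

```python
import queue

# Hypothetical master data table, keyed by customer id.
master_data = {"c42": {"name": "Acme Ltd", "region": "EU"}}
warehouse = []                 # stands in for the warehouse fact table

def enrich(event):
    """Content enrichment: join a change event with its master data."""
    enriched = dict(event)
    enriched.update(master_data.get(event["customer_id"], {}))
    return enriched

dbq = queue.Queue()            # stands in for the Database Queue (DBQ)
dbq.put({"customer_id": "c42", "amount": 99.5})   # pushed by an OLTP trigger

while not dbq.empty():         # push-principle ETL layer: drain events as they arrive
    warehouse.append(enrich(dbq.get()))

print(warehouse)
```

The point of the sketch is the flow direction: the operational side pushes events onto the queue as transactions commit, so the warehouse lags by the queue latency rather than by a nightly batch window.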

    HYBRIDJOIN for near-real-time Data Warehousing

    An important component of near-real-time data warehouses is the near-real-time integration layer. One important element in near-real-time data integration is the join of a continuous input data stream with a disk-based relation. For high-throughput streams, stream-based algorithms such as Mesh Join (MESHJOIN) can be used. However, the performance of MESHJOIN is inversely proportional to the size of the disk-based relation. The Index Nested Loop Join (INLJ) can be set up to process stream input and can deal with intermittences in the update stream, but it has low throughput. This paper introduces a robust stream-based join algorithm called Hybrid Join (HYBRIDJOIN), which combines the two approaches. A theoretical result shows that HYBRIDJOIN is asymptotically as fast as the faster of the two algorithms. The authors present performance measurements of the implementation. In experiments using synthetic data based on a Zipfian distribution, HYBRIDJOIN performs significantly better for typical parameters of the Zipfian distribution, and in general performs in accordance with the theoretical model, while the other two algorithms are unacceptably slow under various settings.
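The asymptotic claim -- HYBRIDJOIN keeps pace with the faster of MESHJOIN and INLJ -- can be illustrated with a back-of-the-envelope cost model. All parameters and the max-of-two rule below are simplifying assumptions for illustration, not the paper's analysis:

```python
def service_rate_inlj(index_probe_ms):
    """INLJ: one index probe per stream tuple; cost does not grow with the
    relation size, but each probe is a random disk access, so it is slow."""
    return 1000.0 / index_probe_ms                    # tuples per second

def service_rate_meshjoin(relation_pages, page_read_ms, memory_tuples):
    """MESHJOIN: each tuple waits one full cyclic scan of the relation, but
    the scan is amortised over all in-memory tuples -- so throughput falls
    as the relation grows."""
    scan_seconds = relation_pages * page_read_ms / 1000.0
    return memory_tuples / scan_seconds               # tuples per second

def service_rate_hybrid(relation_pages, page_read_ms, memory_tuples,
                        index_probe_ms):
    """Toy stand-in for the asymptotic result: the hybrid tracks the faster
    of the two underlying strategies."""
    return max(service_rate_inlj(index_probe_ms),
               service_rate_meshjoin(relation_pages, page_read_ms,
                                     memory_tuples))

small, large = 1_000, 1_000_000                       # relation size in pages
for pages in (small, large):
    print(pages,
          service_rate_meshjoin(pages, 0.1, 10_000),
          service_rate_inlj(5.0),
          service_rate_hybrid(pages, 0.1, 10_000, 5.0))
```

With these made-up numbers, MESHJOIN dominates on the small relation and INLJ on the large one, and the hybrid rate equals whichever is larger in each case.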

    Are High Quality Schools Enough to Close the Achievement Gap? Evidence from a Social Experiment in Harlem

    The Harlem Children’s Zone (HCZ), which combines community investments with reform-minded charter schools, is one of the most ambitious social experiments of our time to alleviate poverty. We provide the first empirical test of the causal impact of HCZ on educational outcomes, with an eye toward informing the long-standing debate over whether schools alone can eliminate the achievement gap or whether the issues that poor children bring to school are too much for educators alone to overcome. Both lottery and instrumental-variable identification strategies lead us to the same story: the Harlem Children’s Zone is effective at increasing the achievement of the poorest minority children. Taken at face value, the effects in middle school are enough to close the black-white achievement gap in mathematics and reduce it by nearly half in English Language Arts. The effects in elementary school close the racial achievement gap in both subjects. We conclude by presenting four pieces of evidence that high-quality schools, or high-quality schools coupled with community investments, generate the achievement gains. Community investments alone cannot explain the results.
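With a binary lottery instrument, the instrumental-variable strategy reduces to the Wald estimator: the intent-to-treat effect on outcomes scaled by the first-stage difference in enrolment. A minimal sketch on hypothetical data (not the paper's):

```python
def wald_iv_estimate(instrument, treatment, outcome):
    """Wald/IV estimator for a binary instrument: the effect is the
    difference in mean outcomes between lottery winners and losers,
    divided by the difference in enrolment rates (the first stage)."""
    def mean_by(z_value, values):
        sel = [v for z, v in zip(instrument, values) if z == z_value]
        return sum(sel) / len(sel)
    itt = mean_by(1, outcome) - mean_by(0, outcome)        # intent-to-treat
    first_stage = mean_by(1, treatment) - mean_by(0, treatment)
    return itt / first_stage

# Hypothetical toy sample: Z = won lottery, D = enrolled, Y = test score.
Z = [1, 1, 1, 1, 0, 0, 0, 0]
D = [1, 1, 1, 0, 0, 0, 1, 0]
Y = [0.6, 0.5, 0.7, 0.1, 0.1, 0.0, 0.4, 0.1]
print(round(wald_iv_estimate(Z, D, Y), 3))   # prints 0.65
```

The lottery works as an instrument because winning it shifts enrolment (3/4 vs 1/4 here) while being randomly assigned, so the scaled difference isolates the effect of attending on those induced to enrol.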

    The Impact of Youth Service on Future Outcomes: Evidence from Teach For America

    Nearly one million American youth have participated in service programs such as Peace Corps and Teach For America. This paper provides the first causal estimate of the impact of service programs on those who serve, using data from a web-based survey of former Teach For America applicants. We estimate the effect of voluntary youth service using a sharp discontinuity in the Teach For America application process. Participating in Teach For America increases racial tolerance, makes individuals more optimistic about the life chances of poor children, and makes them more likely to work in education. We argue that these facts are broadly consistent with the “Contact Hypothesis,” which states that, under appropriate conditions, interpersonal contact can reduce prejudice.

    Exam High Schools and Academic Achievement: Evidence from New York City

    Publicly funded exam schools educate many of the world's most talented students. These schools typically contain higher achieving peers, more rigorous instruction, and additional resources compared to regular public schools. This paper uses a sharp discontinuity in the admissions process at three prominent exam schools in New York City to provide the first causal estimate of the impact of attending an exam school in the United States on longer term academic outcomes. Attending an exam school increases the rigor of high school courses taken and the probability that a student graduates with an advanced high school degree. Surprisingly, however, attending an exam school has little impact on Scholastic Aptitude Test scores, college enrollment, or college graduation -- casting doubt on their ultimate long term impact.

    A high Eddington-ratio, true Seyfert 2 galaxy candidate: implications for broad-line-region models

    A bright, soft X-ray source was detected on 2010 July 14 during an XMM-Newton slew at a position consistent with the galaxy GSN 069 (z=0.018). Previous ROSAT observations failed to detect the source and imply that GSN 069 is now >240 times brighter than it was in 1994 in the soft X-ray band. We report here results from a ~1 yr monitoring with Swift and XMM-Newton, as well as from optical spectroscopy. GSN 069 is an unabsorbed, ultra-soft source in X-rays, with no flux detected above ~1 keV. The soft X-rays exhibit significant variability down to timescales of hundreds of seconds. The UV-to-X-ray spectrum of GSN 069 is consistent with a pure accretion disc model which implies an Eddington ratio of ~0.5 and a black hole mass of ~1.2 million solar masses. A new optical spectrum, obtained ~3.5 months after the XMM-Newton slew detection, is consistent with earlier spectra and lacks any broad line component, classifying the source as a Seyfert 2 galaxy. The lack of cold X-ray absorption and the short-timescale variability in the soft X-rays rule out a standard Seyfert 2 interpretation of the X-ray data. We discuss our results within the framework of two possible scenarios for the broad-line-region (BLR) in AGN, namely the two-phase model (cold BLR clouds in pressure equilibrium with a hotter medium), and models in which the BLR is part of an outflow, or disc-wind. Finally, we point out that GSN 069 may be a member of a population of super-soft AGN whose SED is completely dominated by accretion disc emission, as is the case in some black hole X-ray binary transients during their outburst evolution. The disc emission for a typical AGN with larger black hole mass than GSN 069 does not enter the soft X-ray band, so that GSN 069-like objects would likely be missed by current X-ray surveys, or mis-classified as Compton-thick candidates. (ABRIDGED) Comment: Accepted for publication in MNRAS
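The quoted black hole mass and Eddington ratio together fix the implied bolometric luminosity through the standard Eddington limit, L_Edd ≈ 1.26e38 (M/M_sun) erg/s. A quick arithmetic check (the constant assumes electron-scattering opacity in ionised hydrogen; the rounding is ours):

```python
def eddington_luminosity(mass_msun):
    """Eddington luminosity in erg/s for a mass given in solar masses,
    assuming electron-scattering opacity for ionised hydrogen."""
    return 1.26e38 * mass_msun

m_bh = 1.2e6        # black hole mass from the disc-model fit (solar masses)
edd_ratio = 0.5     # Eddington ratio from the same fit
l_bol = edd_ratio * eddington_luminosity(m_bh)
print(f"L_bol ~ {l_bol:.2e} erg/s")   # ~7.6e43 erg/s
```

The low mass is also why the disc emission reaches the soft X-ray band at all: disc temperature scales inversely with black hole mass, so a comparable disc around a more massive black hole peaks in the unobservable far-UV instead.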